Constructing Robust Neural Decoders Using Limited Training Data

Authors

  • Shamim Nemati
  • Nicholas G. Hatsopoulos
  • Lee E. Miller
  • Andrew H. Fagg
Abstract

One of the essential components of a neuromotor prosthetic device is a neural decoder that translates the activity of a set of neurons into an estimate of the intended movement of the prosthetic limb. Wiener filter-style approaches model this transformation as a linear function of the number of spikes observed from a set of neurons over a range of distinct time bins. More recently, researchers have employed recursive Bayesian estimation techniques, such as Kalman filters, and have reported substantially better performance than with the Wiener filter. It is argued that this improvement in performance is due to the compact nature of these Bayesian models. Our results show that the poor performance of the Wiener filter is restricted to cases in which small training data sets are used, leading to substantial model overfitting. However, when training data sets are larger, we show that the Wiener filter is able to make appropriate use of the additional degrees of freedom to consistently outperform the Kalman filter. Finally, we suggest an alternative to the standard pseudo-inverse approach to solving for the Wiener filter parameters. The resulting algorithm almost always outperforms both of the previous approaches independent of the data set size.
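
The abstract names the decoder families but gives no equations. The following NumPy sketch (hypothetical, not taken from the paper) illustrates the Wiener-filter setup it describes: lagged spike counts are stacked into a design matrix and the weights are found either with the standard pseudo-inverse or with a ridge-regularized solve, which is one plausible form an alternative to the pseudo-inverse could take; the abstract does not specify the authors' actual method, and all function names, the regularization parameter lam, and the synthetic data are assumptions made for illustration.

```python
import numpy as np

def build_lagged_design(spike_counts, n_lags):
    """Stack spike counts from the current and previous time bins.

    spike_counts: (T, n_neurons) array of binned spike counts.
    Returns a (T - n_lags + 1, n_neurons * n_lags + 1) design matrix
    with a constant column for the bias term.
    """
    T, n_neurons = spike_counts.shape
    rows = [spike_counts[t - n_lags + 1 : t + 1].ravel()
            for t in range(n_lags - 1, T)]
    X = np.asarray(rows)
    return np.hstack([X, np.ones((X.shape[0], 1))])  # bias column

def fit_wiener_pinv(X, y):
    """Standard pseudo-inverse (ordinary least-squares) solution."""
    return np.linalg.pinv(X) @ y

def fit_wiener_ridge(X, y, lam=1.0):
    """Ridge-regularized solution; one plausible way to tame overfitting
    on small training sets (the paper's actual alternative is not
    described in the abstract)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Hypothetical usage with synthetic data:
rng = np.random.default_rng(0)
spikes = rng.poisson(2.0, size=(500, 40))   # 500 time bins, 40 neurons
kinematics = rng.normal(size=(500, 2))      # e.g. hand x/y velocity
n_lags = 10
X = build_lagged_design(spikes, n_lags)
y = kinematics[n_lags - 1:]
W_pinv = fit_wiener_pinv(X, y)
W_ridge = fit_wiener_ridge(X, y, lam=10.0)
decoded = X @ W_ridge                       # decoded kinematics
```

With 40 neurons and 10 lags the filter has roughly 400 free parameters, which makes concrete why a small training set invites the overfitting the abstract describes and why a larger set lets those degrees of freedom pay off.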


Similar articles

Making brain–machine interfaces robust to future neural variability

A major hurdle to clinical translation of brain–machine interfaces (BMIs) is that current decoders, which are trained from a small quantity of recent data, become ineffective when neural recording conditions subsequently change. We tested whether a decoder could be made more robust to future neural variability by training it to handle a variety of recording conditions sampled from months of pre...


Corrigendum: Making brain-machine interfaces robust to future neural variability

A major hurdle to clinical translation of brain-machine interfaces (BMIs) is that current decoders, which are trained from a small quantity of recent data, become ineffective when neural recording conditions subsequently change. We tested whether a decoder could be made more robust to future neural variability by training it to handle a variety of recording conditions sampled from months of pre...


Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted train...


Robust Backstepping Control of Induction Motor Drives Using Artificial Neural Networks and Sliding Mode Flux Observers

In this paper, using the three-phase induction motor fifth-order model in a stationary two-axis reference frame with stator current and rotor flux as state variables, a conventional backstepping controller is first designed for speed and rotor flux control of an induction motor drive. Then, in order to make the control system stable and robust against all electromechanical parameter uncertainties a...


Regularizing Prediction Entropy Enhances Deep Learning with Limited Data

Many supervised learning problems require learning with small amounts of training data, since constructing large training datasets could be impractical due to cost, labor, or unavailability of data. For such tasks, constructing deep learning approaches that generalize to new data is difficult. In this paper, we demonstrate the effectiveness of using entropy as a regularizer on image classificat...
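
This teaser does not give the objective it uses. As a hedged illustration only, one common way to regularize prediction entropy is a confidence penalty that subtracts a scaled entropy term from the cross-entropy loss, discouraging overconfident predictions on small training sets; the sign convention, the weight beta, and all names below are assumptions, not the cited paper's formulation.

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_regularized_loss(logits, labels, beta=0.1):
    """Cross-entropy minus beta * mean prediction entropy.

    Subtracting the entropy term penalizes low-entropy (overconfident)
    predictions; this is a common 'confidence penalty' form and is an
    assumption here, since the teaser does not state the exact objective.
    """
    p = softmax(logits)
    n = logits.shape[0]
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    pred_entropy = -(p * np.log(p + 1e-12)).sum(axis=1).mean()
    return ce - beta * pred_entropy

# Hypothetical usage:
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 5))        # 8 samples, 5 classes
labels = rng.integers(0, 5, size=8)
print(entropy_regularized_loss(logits, labels, beta=0.1))
```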



Publication date: 2007